Search Results

Documents authored by Miller, Dale


Document
Invited Talk
A Positive Perspective on Term Representation (Invited Talk)

Authors: Dale Miller and Jui-Hsuan Wu

Published in: LIPIcs, Volume 252, 31st EACSL Annual Conference on Computer Science Logic (CSL 2023)


Abstract
We use the focused proof system LJF as a framework for describing term structures and substitution. Since the proof theory of LJF does not pick a canonical polarization for primitive types, two different approaches to term representation arise. When primitive types are given the negative polarity, LJF proofs encode terms as tree-like structures in a familiar fashion, and cut elimination yields the familiar notion of substitution. When primitive types are given the positive polarity, LJF proofs instead yield a representation in which term structures can be shared explicitly, and cut elimination yields a different notion of substitution. We illustrate these two approaches to term representation by applying them to the encoding of untyped λ-terms. We also exploit techniques from concurrency theory, namely traces and simulation, to compare untyped λ-terms represented under these two structuring disciplines.
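
To make the contrast concrete, here is a minimal sketch, independent of LJF and of the paper's actual encodings: the same untyped λ-term written first as a plain tree and then in a form with explicit sharing, where each compound subterm is named once and referenced by name. The Python representation and all names in it (Var, Abs, App, shared_term, and so on) are invented for this illustration only.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Var:
        name: str

    @dataclass(frozen=True)
    class Abs:
        param: str
        body: object

    @dataclass(frozen=True)
    class App:
        fn: object
        arg: object

    # Tree-like ("negative") reading: the subterm (x x) appears twice in the tree.
    omega_body = App(Var('x'), Var('x'))
    tree_term = App(Abs('x', omega_body), Abs('x', omega_body))

    # Shared ("positive") reading: every compound subterm is defined exactly once
    # and referred to by name; the term is a table of definitions plus a root name.
    shared_term = (
        {'s': ('app', 'x', 'x'),    # s := x x
         'd': ('abs', 'x', 's'),    # d := \x. s
         'r': ('app', 'd', 'd')},   # r := d d  (d is defined once, used twice)
        'r')

In the shared form, substituting for a name touches a single definition, which is one way to picture why cut elimination induces a different notion of substitution in the positive setting.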

Cite as

Dale Miller and Jui-Hsuan Wu. A Positive Perspective on Term Representation (Invited Talk). In 31st EACSL Annual Conference on Computer Science Logic (CSL 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 252, pp. 3:1-3:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{miller_et_al:LIPIcs.CSL.2023.3,
  author =	{Miller, Dale and Wu, Jui-Hsuan},
  title =	{{A Positive Perspective on Term Representation}},
  booktitle =	{31st EACSL Annual Conference on Computer Science Logic (CSL 2023)},
  pages =	{3:1--3:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-264-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{252},
  editor =	{Klin, Bartek and Pimentel, Elaine},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2023.3},
  URN =		{urn:nbn:de:0030-drops-174648},
  doi =		{10.4230/LIPIcs.CSL.2023.3},
  annote =	{Keywords: term representation, sharing, focused proof systems}
}
Document
Two Applications of Logic Programming to Coq

Authors: Matteo Manighetti, Dale Miller, and Alberto Momigliano

Published in: LIPIcs, Volume 188, 26th International Conference on Types for Proofs and Programs (TYPES 2020)


Abstract
The logic programming paradigm provides a flexible setting for representing, manipulating, checking, and elaborating proof structures. This is particularly true when the logic programming language allows for bindings in terms and proofs. In this paper, we make use of two recent innovations at the intersection of logic programming and proof checking. One of these is the foundational proof certificate (FPC) framework, which provides a flexible means of defining the semantics of a range of proof structures for classical and intuitionistic logic. The second is the recently released Coq-Elpi plugin for Coq, in which the Elpi implementation of λProlog can exchange information with the Coq kernel. We illustrate the use of both this Coq plugin and FPCs with two example applications. First, we implement an FPC-driven sequent calculus for a fragment of the Calculus of Inductive Constructions and package it into a tactic that performs property-based testing of inductive types corresponding to Horn clauses. Second, we implement in Elpi a proof checker for first-order intuitionistic logic and demonstrate how proof certificates can be supplied by external (to Coq) provers and then elaborated into fully detailed proof terms that can be checked by the Coq kernel.
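
A rough sense of the certificate-checking idea, though not of the FPC framework or the Coq-Elpi API themselves, can be given with a toy checker in Python rather than Elpi: a certificate lists proof steps for a minimal implicational fragment, and a small kernel-style routine either accepts it, having reconstructed every inference, or rejects it. All names and the representation of formulas below are invented for this sketch.

    # Formulas: an atom is a string; ('imp', A, B) encodes A -> B.

    def check(hypotheses, certificate, goal):
        """Each certificate step is ('hyp', i) to use hypothesis i, or
        ('mp', j, k) to apply modus ponens to the results of steps j and k."""
        derived = []
        for step in certificate:
            if step[0] == 'hyp':
                derived.append(hypotheses[step[1]])
            elif step[0] == 'mp':
                imp, arg = derived[step[1]], derived[step[2]]
                if not (isinstance(imp, tuple) and imp[0] == 'imp' and imp[1] == arg):
                    return False
                derived.append(imp[2])
            else:
                return False
        return bool(derived) and derived[-1] == goal

    # From A and A -> B, the certificate [hyp 0, hyp 1, mp 1 0] establishes B.
    hyps = ['A', ('imp', 'A', 'B')]
    print(check(hyps, [('hyp', 0), ('hyp', 1), ('mp', 1, 0)], 'B'))  # True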

Cite as

Matteo Manighetti, Dale Miller, and Alberto Momigliano. Two Applications of Logic Programming to Coq. In 26th International Conference on Types for Proofs and Programs (TYPES 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 188, pp. 10:1-10:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{manighetti_et_al:LIPIcs.TYPES.2020.10,
  author =	{Manighetti, Matteo and Miller, Dale and Momigliano, Alberto},
  title =	{{Two Applications of Logic Programming to Coq}},
  booktitle =	{26th International Conference on Types for Proofs and Programs (TYPES 2020)},
  pages =	{10:1--10:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-182-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{188},
  editor =	{de'Liguoro, Ugo and Berardi, Stefano and Altenkirch, Thorsten},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.TYPES.2020.10},
  URN =		{urn:nbn:de:0030-drops-138896},
  doi =		{10.4230/LIPIcs.TYPES.2020.10},
  annote =	{Keywords: Proof assistants, logic programming, Coq, \lambdaProlog, property-based testing}
}
Document
Invited Paper
Mechanized Metatheory Revisited: An Extended Abstract (Invited Paper)

Authors: Dale Miller

Published in: LIPIcs, Volume 97, 22nd International Conference on Types for Proofs and Programs (TYPES 2016)


Abstract
Proof assistants and the programming languages that implement them need to deal with a range of linguistic expressions that involve bindings. Since most mature proof assistants do not have built-in methods to treat this aspect of syntax, many of them have been extended with packages and libraries that encode bindings using, for example, de Bruijn numerals and nominal logic features. I argue that bindings are such an intimate aspect of the structure of expressions that they should be accounted for directly in the underlying programming language support for proof assistants, not added later via packages and libraries. I present one possible design for programming languages and proof assistants that supports bindings in syntax directly. The roots of this design lie in the mobility of binders between term-level bindings, formula-level bindings (quantifiers), and proof-level bindings (eigenvariables). In particular, by combining Church's approach to terms and formulas (found in his Simple Theory of Types) and Gentzen's approach to sequent calculus proofs, we can learn how bindings can declaratively interact with the full range of logical connectives and quantifiers. I also illustrate how that framework provides an intimate and semantically clean treatment of computation and reasoning with syntax containing bindings. Some implemented systems that support this built-in treatment of bindings are briefly described.
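
The term-level side of this treatment of bindings can be pictured, very loosely, with a higher-order abstract syntax toy in Python; this is not the λProlog or Abella setting the abstract refers to, and the names below are invented for the sketch. Object-level binders are represented by host-language functions, so substitution comes from the host language rather than from a de Bruijn or nominal encoding.

    from dataclasses import dataclass
    from typing import Callable, Union

    @dataclass
    class App:
        fn: 'Term'
        arg: 'Term'

    @dataclass
    class Abs:
        body: Callable[['Term'], 'Term']   # the binder is a host-language function

    Term = Union[App, Abs]

    def beta(term):
        """One step of beta-reduction: substitution is function application."""
        if isinstance(term, App) and isinstance(term.fn, Abs):
            return term.fn.body(term.arg)
        return term

    identity = Abs(lambda x: x)            # \x. x
    self_app = Abs(lambda x: App(x, x))    # \x. x x
    print(beta(App(identity, self_app)) is self_app)   # True: (\x. x) t reduces to t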

Cite as

Dale Miller. Mechanized Metatheory Revisited: An Extended Abstract (Invited Paper). In 22nd International Conference on Types for Proofs and Programs (TYPES 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 97, pp. 1:1-1:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{miller:LIPIcs.TYPES.2016.1,
  author =	{Miller, Dale},
  title =	{{Mechanized Metatheory Revisited: An Extended Abstract}},
  booktitle =	{22nd International Conference on Types for Proofs and Programs (TYPES 2016)},
  pages =	{1:1--1:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-065-1},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{97},
  editor =	{Ghilezan, Silvia and Geuvers, Herman and Ivetic, Jelena},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.TYPES.2016.1},
  URN =		{urn:nbn:de:0030-drops-98603},
  doi =		{10.4230/LIPIcs.TYPES.2016.1},
  annote =	{Keywords: mechanized metatheory, mobility of binders, lambda-tree syntax, higher-order abstract syntax}
}
Document
Complete Volume
LIPIcs, Volume 84, FSCD'17, Complete Volume

Authors: Dale Miller

Published in: LIPIcs, Volume 84, 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)


Abstract
LIPIcs, Volume 84, FSCD'17, Complete Volume

Cite as

2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@Proceedings{miller:LIPIcs.FSCD.2017,
  title =	{{LIPIcs, Volume 84, FSCD'17, Complete Volume}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017},
  URN =		{urn:nbn:de:0030-drops-79063},
  doi =		{10.4230/LIPIcs.FSCD.2017},
  annote =	{Keywords: Theory of Computation, Computation by Abstract Devices, Analysis of Algorithms and Problem Complexity, Logics and Meanings of Programs, Mathematical Logic and Formal Languages, Programming Techniques, Software/Program Verification, Programming Languages, Deduction and Theorem Proving}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers

Authors: Dale Miller

Published in: LIPIcs, Volume 84, 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)


Abstract
Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers

Cite as

2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 84, pp. 0:i-0:xiv, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{miller:LIPIcs.FSCD.2017.0,
  author =	{Miller, Dale},
  title =	{{Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers}},
  booktitle =	{2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017)},
  pages =	{0:i--0:xiv},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-047-7},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{84},
  editor =	{Miller, Dale},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2017.0},
  URN =		{urn:nbn:de:0030-drops-77126},
  doi =		{10.4230/LIPIcs.FSCD.2017.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Steering Committee, Program Committee, External Reviewers}
}
Document
Separating Functional Computation from Relations

Authors: Ulysse Gérard and Dale Miller

Published in: LIPIcs, Volume 82, 26th EACSL Annual Conference on Computer Science Logic (CSL 2017)


Abstract
The logical foundation of arithmetic generally starts with a quantificational logic over relations. Of course, one often wishes to have a formal treatment of functions within this setting. Both Hilbert and Church added choice operators (such as the epsilon operator) to logic in order to coerce relations that happen to encode functions into actual functions. Others have extended the term language with confluent term rewriting in order to encode functional computation as rewriting to a normal form. We take a different approach that does not extend the underlying logic with either choice principles or with an equality theory. Instead, we use the familiar two-phase construction of focused proofs and capture functional computation entirely within one of these phases. As a result, our logic remains purely relational even when it is computing functions.
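
A small sketch of the underlying distinction, though not of the focused-proof construction the paper uses, can be written in Python: addition over unary numerals defined once as a relation, and the same relation read functionally by fixing its first two arguments. The representation and names are invented for this illustration.

    # Unary numerals: 0 and ('s', n).

    def plus(x, y, z):
        """The relation plus(x, y, z): holds exactly when x + y = z."""
        if x == 0:
            return y == z
        return x[0] == 's' and z != 0 and z[0] == 's' and plus(x[1], y, z[1])

    def plus_fn(x, y):
        """Functional reading of the same relation: the unique z with plus(x, y, z)."""
        if x == 0:
            return y
        return ('s', plus_fn(x[1], y))

    one = ('s', 0)
    two = ('s', one)
    three = ('s', two)
    print(plus(two, one, three))        # True: the relation holds for 2 + 1 = 3
    print(plus_fn(two, one) == three)   # True: the functional reading agrees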

Cite as

Ulysse Gérard and Dale Miller. Separating Functional Computation from Relations. In 26th EACSL Annual Conference on Computer Science Logic (CSL 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 82, pp. 23:1-23:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


BibTeX

@InProceedings{gerard_et_al:LIPIcs.CSL.2017.23,
  author =	{G\'{e}rard, Ulysse and Miller, Dale},
  title =	{{Separating Functional Computation from Relations}},
  booktitle =	{26th EACSL Annual Conference on Computer Science Logic (CSL 2017)},
  pages =	{23:1--23:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-045-3},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{82},
  editor =	{Goranko, Valentin and Dam, Mads},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2017.23},
  URN =		{urn:nbn:de:0030-drops-77040},
  doi =		{10.4230/LIPIcs.CSL.2017.23},
  annote =	{Keywords: focused proof systems, fixed points, computation and deduction}
}
Document
Functions-as-Constructors Higher-Order Unification

Authors: Tomer Libal and Dale Miller

Published in: LIPIcs, Volume 52, 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)


Abstract
Unification is a central operation in the construction of a range of computational logic systems based on first-order and higher-order logics. First-order unification has a number of properties that dominate the way it is incorporated within such systems. In particular, first-order unification is decidable, unary, and can be performed on untyped term structures. None of these three properties holds for full higher-order unification: unification is undecidable, unifiers can be incomparable, and term-level typing can dominate the search for unifiers. The so-called pattern subset of higher-order unification was designed to be a small extension to first-order unification that respected the basic laws governing lambda-binding (the equalities of alpha, beta, and eta-conversion) but also satisfied those three properties. While the pattern fragment of higher-order unification has been popular in various implemented systems and in various theoretical considerations, it is too weak for a number of applications. In this paper, we define an extension of pattern unification that is motivated by some existing applications and that satisfies these three properties. The main idea behind this extension is that the arguments to a higher-order free variable can be more than just distinct bound variables: they can also be terms constructed from (sufficient numbers of) such variables using term constructors, provided that no argument is a subterm of any other argument. We show that this extension to pattern unification satisfies the three properties mentioned above.
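
The argument restriction itself is easy to state operationally. The Python below is a sketch of just that restriction, not of the unification algorithm: a bound variable is a string, a constructor application is a tuple (constructor, arg1, ..., argn), and a list of arguments is acceptable when each is built from the allowed bound variables using constructors and no argument is a subterm of another. All names are invented for this sketch.

    def subterms(t):
        yield t
        if isinstance(t, tuple):
            for s in t[1:]:
                yield from subterms(s)

    def built_from(t, bound_vars):
        if isinstance(t, str):
            return t in bound_vars
        return isinstance(t, tuple) and all(built_from(s, bound_vars) for s in t[1:])

    def fc_arguments(args, bound_vars):
        """Check the functions-as-constructors restriction on an argument list."""
        if not all(built_from(a, bound_vars) for a in args):
            return False
        return not any(i != j and a in subterms(b)
                       for i, a in enumerate(args)
                       for j, b in enumerate(args))

    print(fc_arguments(['x', 'y'], {'x', 'y'}))                      # True: the pattern case
    print(fc_arguments([('pair', 'x', 'y'), 'z'], {'x', 'y', 'z'}))  # True: constructed arguments allowed
    print(fc_arguments([('pair', 'x', 'y'), 'x'], {'x', 'y'}))       # False: 'x' is a subterm of the first argument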

Cite as

Tomer Libal and Dale Miller. Functions-as-Constructors Higher-Order Unification. In 1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 52, pp. 26:1-26:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


BibTeX

@InProceedings{libal_et_al:LIPIcs.FSCD.2016.26,
  author =	{Libal, Tomer and Miller, Dale},
  title =	{{Functions-as-Constructors Higher-Order Unification}},
  booktitle =	{1st International Conference on Formal Structures for Computation and Deduction (FSCD 2016)},
  pages =	{26:1--26:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-010-1},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{52},
  editor =	{Kesner, Delia and Pientka, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2016.26},
  URN =		{urn:nbn:de:0030-drops-59810},
  doi =		{10.4230/LIPIcs.FSCD.2016.26},
  annote =	{Keywords: higher-order unification, pattern unification}
}
Document
A Systematic Approach to Canonicity in the Classical Sequent Calculus

Authors: Kaustuv Chaudhuri, Stefan Hetzl, and Dale Miller

Published in: LIPIcs, Volume 16, Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL (2012)


Abstract
The sequent calculus is often criticized for requiring proofs to contain large amounts of low-level syntactic details that can obscure the essence of a given proof. Because each inference rule introduces only a single connective, sequent proofs can separate closely related steps, such as instantiating a block of quantifiers, by irrelevant noise. Moreover, the sequential nature of sequent proofs forces proof steps that are syntactically non-interfering and permutable to nevertheless be written in some arbitrary order. The sequent calculus thus lacks a notion of canonicity: proofs that should be considered essentially the same may not have a common syntactic form. To fix this problem, many researchers have proposed replacing the sequent calculus with proof structures that are more parallel or geometric. Proof-nets, matings, and atomic flows are examples of such revolutionary formalisms. We propose, instead, an evolutionary approach to recover canonicity within the sequent calculus, which we illustrate for classical first-order logic. The essential element of our approach is the use of a multi-focused sequent calculus as the means of abstracting away the details from classical cut-free sequent proofs. We show that, among the multi-focused proofs, the maximally multi-focused proofs that make the foci as parallel as possible are canonical. Moreover, such proofs are isomorphic to expansion proofs, a well-known, minimalistic, and parallel generalization of Herbrand disjunctions, for classical first-order logic. This technique is a systematic way to recover the desired essence of any sequent proof without abandoning the sequent calculus.
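
For a point of reference on the last step, the following is a standard textbook example of a Herbrand disjunction (not taken from the paper), the kind of instantiation data that expansion proofs record while abstracting from the order of inference steps. For the valid formula on the left, the two witness terms c and f(c) yield a propositional tautology on the right:

    \exists x\, \bigl(P(x) \rightarrow P(f(x))\bigr)
    \;\rightsquigarrow\;
    \bigl(P(c) \rightarrow P(f(c))\bigr) \lor \bigl(P(f(c)) \rightarrow P(f(f(c)))\bigr)

If the first disjunct fails, then P(c) holds and P(f(c)) does not, which makes the second disjunct true; an expansion proof for this formula essentially consists of those two instantiations together with that propositional check.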

Cite as

Kaustuv Chaudhuri, Stefan Hetzl, and Dale Miller. A Systematic Approach to Canonicity in the Classical Sequent Calculus. In Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL. Leibniz International Proceedings in Informatics (LIPIcs), Volume 16, pp. 183-197, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


BibTeX

@InProceedings{chaudhuri_et_al:LIPIcs.CSL.2012.183,
  author =	{Chaudhuri, Kaustuv and Hetzl, Stefan and Miller, Dale},
  title =	{{A Systematic Approach to Canonicity in the Classical Sequent Calculus}},
  booktitle =	{Computer Science Logic (CSL'12) - 26th International Workshop/21st Annual Conference of the EACSL},
  pages =	{183--197},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-42-2},
  ISSN =	{1868-8969},
  year =	{2012},
  volume =	{16},
  editor =	{C\'{e}gielski, Patrick and Durand, Arnaud},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2012.183},
  URN =		{urn:nbn:de:0030-drops-36723},
  doi =		{10.4230/LIPIcs.CSL.2012.183},
  annote =	{Keywords: Sequent Calculus, Canonicity, Classical Logic, Expansion Trees}
}